AI emotion detection
Speech and facial recognition combine to boost AI emotion detection
Researchers have combined speech and facial recognition data to improve the emotion detection abilities of AIs. Recognising emotions is a longstanding goal of AI research: accurate recognition could enable detecting tiredness at the wheel, anger that could precede a crime, or perhaps even signs of sadness or depression at suicide hotspots. The nuances of how people speak and move their facial muscles to express moods have long presented a challenge. In a paper (PDF) on arXiv, researchers at the University of Science and Technology of China in Hefei detail some progress.
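The core idea in the research above, combining speech and facial data to classify emotion, can be sketched as feature-level fusion: per-modality feature vectors are concatenated and classified in the joint space. This is a minimal illustration, not the paper's actual architecture; the emotion labels, feature dimensions, and the nearest-centroid classifier are all invented for the sketch.

```python
import random
import math

random.seed(0)
EMOTIONS = ["neutral", "happy", "angry", "sad"]
SPEECH_DIM, FACE_DIM = 16, 8  # hypothetical per-modality feature sizes

def fuse(speech, face):
    """Feature-level fusion: concatenate per-modality feature vectors."""
    return speech + face

# Synthetic stand-in data: each emotion gets a distinct centroid in the
# fused (speech + face) feature space.
centroids = {e: [random.gauss(0, 1) for _ in range(SPEECH_DIM + FACE_DIM)]
             for e in EMOTIONS}

def sample(emotion, noise=0.3):
    """Draw a noisy fused feature vector near an emotion's centroid."""
    return [c + random.gauss(0, noise) for c in centroids[emotion]]

def classify(speech, face):
    """Assign the emotion whose centroid is nearest to the fused vector."""
    fused = fuse(speech, face)
    return min(EMOTIONS, key=lambda e: math.dist(fused, centroids[e]))

# A sample drawn near the "angry" centroid, split back into its two
# modality halves, should classify as angry.
s = sample("angry", noise=0.1)
print(classify(s[:SPEECH_DIM], s[SPEECH_DIM:]))
```

In practice each modality's features would come from trained speech and vision models, and the fusion step is where combining modalities can outperform either one alone, since cues missing in the voice may be present on the face and vice versa.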
AI emotion detection among next big things in marketing: Zenith
Emotion-recognition technology is emerging as one of this year's top AI trends, enabling brands to build on consumers' emotions to deliver customised services and products, according to a report published by Zenith. The report, Artificial Intelligence: Powering the Consumer Journey, argues that the mood-sensing devices consumers carry in their pockets (smartphones), together with the continuing refinement of customised triggers enabled by emotion-detection tech, will allow brands to make more emotionally relevant recommendations and enhance programmatic advertising. Bentley has used such technology in the Bentley Inspirator, an app that configures a Bentayga SUV for order based on a consumer's reaction to a film. The report also cites the ability of IoT household devices (such as Amazon's Alexa) to deliver services and recommendations tailored to consumers' emotions. "Human decisions are heavily influenced by emotion," said Hugo Pinto, IBM Interactive Experience's innovation officer, in an interview published in the Zenith report.